Recurrent Neural Network Compression Based on Low-Rank Tensor Representation
Authors
Abstract
Similar Resources
Low-rank Representation for Enhanced Deep Neural Network Acoustic Models
Automatic speech recognition (ASR) is a fascinating area of research towards realizing human-machine interaction. After more than 30 years of reliance on Gaussian Mixture Models (GMMs), state-of-the-art systems currently rely on Deep Neural Networks (DNNs) to estimate class-conditional posterior probabilities. The posterior probabilities are used for acoustic modeling in hidden Markov models ...
Dimensionality Reduction Based on Low Rank Representation
Dimensionality reduction is a common way to address the 'curse of dimensionality', especially in image processing. Among these methods, linear methods are believed to perform better on real databases. This paper proposes a novel unsupervised linear dimensionality reduction method based on low-rank representation, which aims at finding the subspace structure of the o...
Tensor Decomposition for Compressing Recurrent Neural Network
In the machine learning field, the Recurrent Neural Network (RNN) has become a popular algorithm for sequential data modeling. However, behind their impressive performance, RNNs require a large number of parameters for both training and inference. In this paper, we aim to reduce the number of parameters while maintaining the expressive power of the RNN. We utilize several tensor decom...
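The compression idea summarized in the abstract above can be illustrated with the simplest low-rank scheme: factorizing a dense weight matrix via a truncated SVD. This is a minimal sketch of the general principle, not the specific tensor decompositions used in the paper; the matrix sizes and target rank are assumed for illustration.

```python
import numpy as np

# Illustrative sketch: compress a dense weight matrix W (n x m) into two
# smaller factors A (n x r) and B (r x m) using a truncated SVD.
rng = np.random.default_rng(0)
n, m, r = 256, 256, 16  # assumed layer size and target rank

# Synthetic weight matrix that is approximately rank-r plus small noise.
W = rng.standard_normal((n, r)) @ rng.standard_normal((r, m)) \
    + 0.01 * rng.standard_normal((n, m))

U, s, Vt = np.linalg.svd(W, full_matrices=False)
A = U[:, :r] * s[:r]   # n x r factor (singular values folded in)
B = Vt[:r, :]          # r x m factor

full_params = n * m                 # parameters of the dense matrix
low_rank_params = n * r + r * m     # parameters after factorization
rel_err = np.linalg.norm(W - A @ B) / np.linalg.norm(W)

print(f"parameters: {full_params} -> {low_rank_params}")
print(f"relative reconstruction error: {rel_err:.4f}")
```

A forward pass then computes `x @ A @ B` instead of `x @ W`, trading a small approximation error for roughly an 8x reduction in parameters at this rank; tensor decompositions such as Tensor-Train extend the same idea to higher-order reshapings of the weights.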
Neuron Mathematical Model Representation of Neural Tensor Network for RDF Knowledge Base Completion
In this paper, a state-of-the-art neuron mathematical model of the neural tensor network (NTN) is proposed for the RDF knowledge base completion problem. One difficulty with the network's parameters is that its neuron mathematical model cannot be represented directly. For this reason, a new representation of this network is suggested that resolves this difficulty. In the representation, th...
Neural network representation of tensor network and chiral states
We study the representational power of a Boltzmann machine (a type of neural network) in quantum many-body systems. We prove that any (local) tensor network state has a (local) neural network representation. The construction is almost optimal in the sense that the number of parameters in the neural network representation is almost linear in the number of nonzero parameters in the tensor network...
Journal
Journal title: IEICE Transactions on Information and Systems
Year: 2020
ISSN: 0916-8532,1745-1361
DOI: 10.1587/transinf.2019edp7040